import { Tabs, TabItem } from '@aws-amplify/ui-react';
import { ExampleCode } from '@/components/Example';
import { Fragment } from '@/components/Fragment';
<Fragment>
  {({}) => import(`./liveness-service-setup.mdx`)}
</Fragment>
### Step 2. Install dependencies
The FaceLivenessDetector component is built using [Jetpack Compose](https://developer.android.com/jetpack/compose). Enable Jetpack Compose by adding the following to the `android` section of your **app**'s `build.gradle` file:
```groovy
buildFeatures {
    compose true
}
composeOptions {
    kotlinCompilerExtensionVersion '1.2.0'
}
```
Add the following dependencies to your **app**'s `build.gradle` file and click "Sync Now" when prompted:
```groovy
dependencies {
    // FaceLivenessDetector dependency
    implementation 'com.amplifyframework.ui:liveness:1.1.1'

    // Material3 dependency for theming FaceLivenessDetector
    implementation 'androidx.compose.material3:material3:1.1.0'
}
```
### Step 3. Request camera permissions
FaceLivenessDetector requires access to the device's camera to perform the Face Liveness check. Before displaying FaceLivenessDetector, prompt the user to grant camera permission. Follow these guides for examples of requesting camera permission using either [Android](https://developer.android.com/training/permissions/requesting) or [Jetpack Compose](https://google.github.io/accompanist/permissions/).
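As a minimal sketch of the Android approach, using the AndroidX Activity Result API, the permission check might look like the following. The `showLivenessCheck` helper is a hypothetical placeholder for wherever your app displays FaceLivenessDetector (Step 4):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.core.content.ContextCompat

class MainActivity : ComponentActivity() {

    // Launcher that receives the user's response to the permission prompt
    private val requestCameraPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) {
                showLivenessCheck()
            } else {
                // Explain that the Face Liveness check cannot proceed
                // without camera access
            }
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) ==
            PackageManager.PERMISSION_GRANTED
        ) {
            showLivenessCheck()
        } else {
            requestCameraPermission.launch(Manifest.permission.CAMERA)
        }
    }

    private fun showLivenessCheck() {
        // Hypothetical helper: display FaceLivenessDetector here (see Step 4)
    }
}
```

Remember to also declare `<uses-permission android:name="android.permission.CAMERA" />` in your app's `AndroidManifest.xml`.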
### Step 4. Add FaceLivenessDetector
In the `onCreate` of your app's `MainActivity`, add the following code to display FaceLivenessDetector, replacing `<session ID>` with the session ID returned from creating the Face Liveness session and replacing `<region>` with the region you would like to use for the Face Liveness check. The list of supported regions is in the [Amazon Rekognition Face Liveness developer guide](https://docs.aws.amazon.com/general/latest/gr/rekognition.html). The code below wraps FaceLivenessDetector in a MaterialTheme that uses the Face Liveness color scheme. More information about theming is in the [Face Liveness Customization page](./liveness/customization).
```kotlin
setContent {
    MaterialTheme(
        colorScheme = LivenessColorScheme.default()
    ) {
        FaceLivenessDetector(
            sessionId = <session ID>,
            region = <region>,
            onComplete = {
                Log.i("MyApp", "Face Liveness flow is complete")
                // The Face Liveness flow is complete and the session
                // results are ready. Use your backend to retrieve the
                // results for the Face Liveness session.
            },
            onError = { error ->
                Log.e("MyApp", "Error during Face Liveness flow", error)
                // An error occurred during the Face Liveness flow, such as
                // a timeout or missing camera permissions.
            }
        )
    }
}
```
FaceLivenessDetector must be created in Kotlin but can still be used in a Java-based app. First, create a new Kotlin file called `MyView` and add the following code to create FaceLivenessDetector, replacing `<session ID>` with the session ID returned from creating the Face Liveness session and replacing `<region>` with the region you would like to use for the Face Liveness check. The list of supported regions is in the [Amazon Rekognition Face Liveness developer guide](https://docs.aws.amazon.com/general/latest/gr/rekognition.html). The code below wraps FaceLivenessDetector in a MaterialTheme that uses the Liveness color scheme. More information about theming is in the [Liveness Customization page](./liveness/customization).
```kotlin
object MyView {
    fun setViewContent(activity: ComponentActivity) {
        activity.setContent {
            MaterialTheme(
                colorScheme = LivenessColorScheme.default()
            ) {
                FaceLivenessDetector(
                    sessionId = <session ID>,
                    region = <region>,
                    onComplete = {
                        Log.i("MyApp", "Face Liveness flow is complete")
                        // The Face Liveness flow is complete and the
                        // session results are ready. Use your backend to
                        // retrieve the results for the Face Liveness session.
                    },
                    onError = { error ->
                        Log.e("MyApp", "Error during Face Liveness flow", error)
                        // An error occurred during the Face Liveness flow,
                        // such as a timeout or missing camera permissions.
                    }
                )
            }
        }
    }
}
```
In the `onCreate` of your app's `MainActivity`, add the following code to display FaceLivenessDetector:
```java
MyView.setViewContent(this);
```
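For context, a minimal sketch of the Java `MainActivity` making that call, assuming a `ComponentActivity` subclass and that camera permission has already been granted, might look like:

```java
import android.os.Bundle;
import androidx.activity.ComponentActivity;

public class MainActivity extends ComponentActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Hands rendering off to the Kotlin MyView helper defined above
        MyView.setViewContent(this);
    }
}
```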